On Ordinal VC-Dimension and Some Notions of Complexity
Authors
Abstract
We generalize the classical notion of VC-dimension to ordinal VC-dimension, in the context of logical learning paradigms. Logical learning paradigms encompass the numerical learning paradigms commonly studied in Inductive Inference. A logical learning paradigm is defined as a set W of structures over some vocabulary, together with a set D of first-order formulas that represent data. The sets of models of φ in W, where φ ranges over D, generate a natural topology over W. We show that if D is closed under Boolean operators, then the notion of ordinal VC-dimension provides an exact characterization of the problem of predicting the truth of the members of D in a member of W, with an ordinal bound on the number of mistakes. This shows that the notion of VC-dimension has a natural interpretation in Inductive Inference when cast into a logical setting. We also study the relationships between predictive complexity, selective complexity (a variation on predictive complexity), and mind change complexity. The assumptions that D is closed under Boolean operators and that W is compact often play a crucial role in establishing connections between these concepts.
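To make the objects concrete, the following is a minimal formal sketch in standard notation; the symbol Mod_W(φ) and the phrasing of shattering below are conventional choices rather than the paper's exact notation, and the ordinal refinement itself is only named, since its precise definition is the subject of the paper.

\[
\mathrm{Mod}_W(\varphi) = \{\, \mathfrak{A} \in W : \mathfrak{A} \models \varphi \,\} \quad (\varphi \in D),
\qquad
\mathcal{T} = \text{the topology on } W \text{ with subbasis } \{\mathrm{Mod}_W(\varphi) : \varphi \in D\}.
\]

In this setting, a finite set \(\{\varphi_1, \ldots, \varphi_n\} \subseteq D\) is shattered if every truth assignment to \(\varphi_1, \ldots, \varphi_n\) is realized by some structure in W, i.e., for every \(S \subseteq \{1, \ldots, n\}\) there is some \(\mathfrak{A} \in W\) with \(\mathfrak{A} \models \varphi_i\) exactly when \(i \in S\). The classical VC-dimension is the supremum of such n; the ordinal VC-dimension replaces this integer count with an ordinal-valued measure, which is what makes ordinal mistake bounds for prediction expressible.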
Similar Papers
Sauer's Bound for a Notion of Teaching Complexity
This paper establishes an upper bound on the size of a concept class with given recursive teaching dimension (RTD), a teaching complexity parameter. The upper bound coincides with Sauer's well-known bound on classes with a fixed VC-dimension. Our result thus supports the recently emerging conjecture that the combinatorics of VC-dimension and those of teaching complexity are intrinsically interl...
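For reference, Sauer's bound, which the result above matches with the recursive teaching dimension in place of the VC-dimension, states that a concept class \(C \subseteq \{0,1\}^n\) of VC-dimension at most \(d\) satisfies

\[
|C| \;\le\; \sum_{i=0}^{d} \binom{n}{i}.
\]

(Only the classical bound is reproduced here; the exact statement proved for RTD is in the paper itself.)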
Almost-everywhere Algorithmic Stability and Generalization Error
We introduce a new notion of algorithmic stability, which we call training stability. We show that training stability is sufficient for good bounds on generalization error. These bounds hold even when the learner has infinite VC dimension. In the PAC setting, training stability gives necessary and sufficient conditions for exponential convergence, and thus serves as a distribution-dependent ana...
Popper, Falsification and the VC-dimension
We compare Sir Karl Popper’s ideas concerning the falsifiability of a theory with similar notions from VC-theory. Having located some divergences, we discuss how best to view Popper’s work from the perspective of statistical learning theory.
Algorithmic Stability 3 4 Regularization Algorithms in an RKHS
In the last few lectures we have seen a number of different generalization error bounds for learning algorithms, using notions such as the growth function and VC dimension; covering numbers, pseudo-dimension, and fat-shattering dimension; margins; and Rademacher averages. While these bounds are different in nature and apply in different contexts, a unifying factor that they all share is that tha...
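As one concrete instance of the VC-type bounds alluded to above (stated with an unspecified constant, since the exact constants differ between presentations): for a hypothesis class \(H\) of VC-dimension \(d\) and an i.i.d. sample of size \(n\), with probability at least \(1-\delta\), every \(h \in H\) satisfies

\[
R(h) \;\le\; \widehat{R}_n(h) + O\!\left(\sqrt{\frac{d \log(n/d) + \log(1/\delta)}{n}}\right),
\]

where \(R\) is the true risk and \(\widehat{R}_n\) the empirical risk. Roughly speaking, the covering-number, fat-shattering, margin, and Rademacher bounds mentioned in the excerpt share this shape, with the corresponding capacity measure in place of the VC term.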
Recursive teaching dimension, VC-dimension and sample compression
This paper is concerned with various combinatorial parameters of classes that can be learned from a small set of examples. We show that the recursive teaching dimension, recently introduced by Zilles et al. (2008), is strongly connected to known complexity notions in machine learning, e.g., the self-directed learning complexity and the VC-dimension. To the best of our knowledge these are the fi...
Journal: Theor. Comput. Sci.
Volume: 364, Issue: -
Pages: -
Year of publication: 2003